Taming Prometheus: Talk About Safety and Culture
Susan S. Silbey
Abstract
Talk of safety culture has emerged as a common trope in contemporary scholarship and popular media as an explanation for accidents and as a recipe for improvement in complex sociotechnical systems. Three conceptions of culture appear in talk about safety: culture as causal attitude, culture as engineered organization, and culture as emergent and indeterminate. If we understand culture, as sociologists and anthropologists theorize, as an indissoluble dialectic of system and practice, as both the product and context of social action, the first two perspectives, deploying standard causal logics, fail to provide persuasive accounts. Displaying affinities with individualist and reductionist epistemologies, safety culture is frequently operationalized in terms of the attitudes and behaviors of individual actors, often the lowest-level actors, with the least authority, in the organizational hierarchy. Sociological critiques claim that culture is emergent and indeterminate and cannot be instrumentalized to prevent technological accidents. Research should explore the features of complex systems that have been elided in the talk of safety culture: normative heterogeneity and conflict, inequalities in power and authority, and competing sets of legitimate interests within organizations.

Annu. Rev. Sociol. 2009. 35:341–369. Downloaded from arjournals.annualreviews.org by MASSACHUSETTS INSTITUTE OF TECHNOLOGY on 09/10/09. For personal use only.
ANRV381-SO35-17 ARI 5 June 2009 9:28

“. . .the darkest and most treacherous of all the countries . . . lie in the tropic between intentions and actions. . . .”
—Chabon (2008, p. 29)

Rescuing Prometheus, by the venerable historian of technology Thomas Hughes (1998), describes how four large post–World War II projects revolutionized the aerospace, computing, and communication industries by transforming bureaucratic organizations into postmodern technological systems.
In place of centralized hierarchies of tightly coupled homogeneous units typical of traditional corporate and military organizations, Hughes describes the invention of loosely coupled networks of heterogeneously distributed and often collegially connected communities of diverse participants. By the 1980s and 1990s, new modes of management and design—public participation coupled with commitments to environmental repair and protection—overcame what had been intensifying resistance to large-scale, often government-sponsored, technologies. “Prometheus the creator,” Hughes (1998, p. 14) writes, “once restrained by defense projects sharply focused upon technical and economic problems, is now free to embrace the messy environmental, political, and social complexity of the postindustrial world.” If the engineering accomplishments of the past 40 years signify a resuscitated capacity to mobilize natural and human resources to produce, distribute, and accumulate on historically unprecedented scales, proliferating interest in safety culture may signal renewed efforts to tame Prometheus. In the past 20 years, a new way of talking about the consequences of complex organizations and sociotechnical systems[1] has developed.

[1] “The notion of a sociotechnical system stresses the close interdependence of both the technological artifacts and behavioral resources (individual, group, and organizational) necessary for the operation of any large-scale technology” (Pidgeon 1991, p. 131).

Although culture is a common sociological subject, those talking about safety culture often invoke the iconic concept with little of the theoretical edifice sociologists and anthropologists have built for cultural analysis. Decades after the social sciences reconceptualized culture as “the medium of lived experience” (Jacobs & Hanrahan 2005, p. 1), a normatively plural system of symbols and meanings that both enables and constrains social practice and action (Sewell 2005, pp. 152–75; Silbey 2001; 2005a, p.
343), the cultural turn has taken root in the military and engineering professions, and for similar reasons: human action and culture getting in the way of technological efficiency. However, unlike the military’s embrace of culture where critique confronts its every move (Gusterson 2007), efforts to propagate safety culture in complex technological systems proceed with scant attention to its ideological implications. Despite the appropriation of the term culture, many advocates and scholars of technological innovation and management deploy distinctly instrumental and reductionist epistemologies antithetical to cultural analysis. We can be protected from the consequences of our very effective instrumental rationalist logics and safety can be achieved, they seem to suggest, by attending to what advocates of safety culture treat as an ephemeral yet manageable residue of human intercourse—something akin to noise in the system. How are we to understand this unexpected and unusual appropriation of the central term of the soft sciences by the experts of the hard, engineering sciences? This article reviews popular talk and scholarship about safety culture. Since the 1990s, identifying broken or otherwise damaged safety culture has become a familiar explanation for organizational and technological failures. Although the term safety culture has been deployed across institutional sites and scholarly fields, it is largely absent from sociological scholarship. Sociologists studying accidents and disasters provide a more critical and skeptical view of safety culture, if they address it at all (e.g., Beamish 2002; Clarke 1989, 1999, 2006; Gieryn & Figert 1990; Hilgartner 1992; Perin 2005; Perrow 1999 [1984], 2007; Vaughan 1996). However, in engineering and
management scholarship, the term safety culture is invoked with increasing frequency and seems to refer to a commonly shared, stable set of practices in which all members of an organization learn from errors to minimize risk and maximize safety in the performance of organizational tasks and the achievement of production goals. In this review, I argue that the endorsement of safety culture can be usefully understood as a way of encouraging and allocating responsibility (Shamir 2008)—one response to the dangers of technological systems. Invoking culture as both the explanation and remedy for technological disasters obscures the different interests and power relations enacted in complex organizations. Although it need not, talk about culture often focuses attention primarily on the low-level workers who become responsible, in the last instance, for organizational consequences, including safety. Rather than forgoing particularly dangerous technologies or doing less in order to reduce vulnerabilities to natural, industrial, or terrorist catastrophes, talk about safety culture reinforces investments in complex, hard-to-control systems as necessary and manageable, as well as highly profitable (for a few), although unavoidably and unfortunately dangerous (for many) (Perrow 2007). At the same time, talk of safety culture suggests that the risks associated with increased efficiency and profitability can be responsibly managed and contained. The literature on safety culture traces its provenance to the copious work on risk assessment and systems analysis, system dynamics, and systems engineering that became so prevalent over the past 30 years.[2] At the outset, paying attention to culture seems an important and valuable modification to what can be overly abstract and asocial theories of work and organization.
Despite this important correction, research on safety culture usually ignores the historical-political context, the structural relationships, and the interdependencies that are essential to cultural and organizational performances and analyses.

[2] Risk and systems analysis pervades contemporary organizations from manufacturing, transportation, and communications to finance, health, and education.

This review first provides a historical framing for talk about safety culture because that perspective is most clearly missing in much of the research. I suggest that talk about safety culture emerges alongside market discourse that successfully challenged the previous centuries’ mechanisms for distributing and mitigating technological risks. In the second section, I describe the more than fourfold increase in references to safety culture that appeared in popular and academic literature between 2000 and 2007. Organizing the work in terms of three commonly deployed conceptions, I then describe culture as causal attitude, as engineered organization, and as emergent. Relying on a conception of culture as an indissoluble dialectic of system and practice, both a product and context of social action, I argue that the first two perspectives not only fail to provide persuasive accounts, but reproduce individualist and reductionist epistemologies that are unable to reliably explain social or system performance. Although invocation of safety culture seems to recognize and acknowledge systemic processes and effects, it is often conceptualized to be measurable and malleable in terms of the attitudes and behaviors of individual actors, often the lowest-level actors, with the least authority, in the organizational hierarchy. The third category, culture as emergent and indeterminate, critiques claims that safety culture can be confidently instrumentalized to prevent catastrophic outcomes from complex technologies.
This section suggests that future research on safety in complex systems should explore just those features of complex systems that are elided in the talk of safety culture: normative heterogeneity and cultural conflict, competing sets of interests within organizations, and inequalities in power and authority. Rather than imagine complex yet homogeneous local cultures, research should explore how struggles among competing interests are part of the processes of cultural production and how normative heterogeneity, structured competition, and countervailing centers of power can contribute to, rather than undermine, safer technologies.

HISTORICAL SHIFTS: CONSTRUCTING AND DECONSTRUCTING SAFETY NETS

Why has attention to safety culture arisen at this historical moment? Any answer must begin by acknowledging the technological catastrophes of the past 40 years: Three Mile Island, Bhopal, Chernobyl, the Challenger and Columbia accidents at NASA, the Exxon Valdez oil spill, oil rig accidents, Buffalo Creek, contaminated blood transfusions, and a host of less spectacular disasters (Ballard 1988; Davidson 1990; Erikson 1978; Fortun 2001; Jasanoff 1994; Keeble 1991; Kurzman 1987; Medvedev 1992; Petryna 2002; Rees 1994; Setbon 1993; Stephens 1980; Stern 1976/2008; Vaughan 1996, 2003, 2006; Walker 2004). In each instance, the accident was usually explained as just that, an accident—not a system or design failure, but the result of some extraneous mistake or mismanagement of a basically well-conceived technology. Because the systems in which the accidents occurred are omnipresent, the recurring accidents undermine confidence that catastrophes can be avoided.
Alongside concerns about genetically modified foods, the toxicity of commonly used household products, the migration of various synthetic compounds from plants through animals into the human body, the rapid spread of disease and contamination through porous and swift global transportation routes, and human-produced environmental degradation, technological accidents feed a deepening mistrust of science (Jasanoff 2005). If, as Hughes (1998) suggests, the invention of postmodern systems rescued Prometheus from the technological disillusionment of the 1960s and 1970s, perhaps the promotion of safety culture responds to a renewed technological skepticism in the twenty-first century. However, accidents alone cannot be driving the recent attention to safety culture. Technological accidents are not new phenomena, and safety has been a lively concern since the middle of the nineteenth century, if not earlier. Indeed, in some accounts, much of the regulatory apparatus of the modern state was institutionalized to protect against the injurious consequences of industrial production by setting minimally safe conditions of work, establishing private actions at law, and spreading the risks (of what could not be prevented) through fair trade practices, workmen’s compensation, and pension systems, as well as labor unions, private mutual help, and insurance. Safety was one of several objectives promoted by the system of instruments regulating relations between capital and labor (cf. Baker 2002, Ewald 2002, Friedman 1967, Orren 1991, Welke 2001, Witt 2004). In a sense, the invention of risk,[3] and with it widespread insurance and regulation of workplaces, products, and markets, created the basis of a new social contract. Responsibility was transferred from the person to the situation—the job, the firm, the union, or the collective nation—forgoing reliance on any individual’s behavior, whether worker or boss.
Eschewing interest in specific causality, and thus individual liability, this collectivized regime acknowledged a general source of insecurity in technology and responded with a set of generalized responses, albeit after extended and sometimes tragic struggle. Where responsibility had previously rested on the idea of proximate cause and a selective distribution of costs based on liability as a consequence of imprudence, the late nineteenth and early twentieth century industrial and business regulation redistributed costs to collectivities, offering compensation and reparation, if not safety and security. Responsibility was “no longer the attribute of a subject, but rather a consequence of a social fact” (Ewald 2002, p. 279).

[3] Accounts vary as to the moment when probabilistic calculation about hazardous events became a recognized practice (see Hacking 1990, 2003).

One was no longer “responsible because one is free by nature and could therefore have acted differently, but because society judges it ‘fair’” to place responsibility in a particular social location, that is, to cause a particular person or collectivity to bear the financial costs of the injury. In short, the costs of technological consequences were dispersed, “the source and foundation of responsibility . . . displaced from the individual onto society” (p. 279). Talk about safety culture offers a new twist, or possible reversion, in the allocation of responsibility for technological failures, a return to the nineteenth century and earlier regimes of individual responsibility, but in a context of more hazardous and global technologies.
After several decades of sustained attack by advocates seeking supposedly more efficient and just allocations of goods through unregulated markets, the regime of collective responsibility has been dismantled, replaced by one of institutional flexibility. Rather than attempting to mitigate and distribute risk, contemporary policies and practices embrace risk (Baker & Simon 2002, p. 1). Embracing risk means to “conceive and address social problems in terms of risk”—calculated probability of hazard (Heimer 1988, Simon 1988). Human life, including the prospects of human autonomy and agency, is now conceived in very much the same way and analyzed with the same tools we employ to understand and manipulate physical matter: ordered in all its important aspects by instrumental and probabilistic calculation and mechanical regulation (Bittner 1983). Unfortunately, risk analysis and discourse narrow consideration of legitimate alternatives while nonetheless sustaining the appearance of broad pluralism (cf. Habermas 1975). Because of the assumption that realism resides exclusively in science, reflexive observation and critique as well as unmeasured variables are excluded from official risk discourses. As a consequence, allegedly empirical analyses become solipsistic, focusing exclusively on the methods and epistemologies that are internal to technological instrumentalism (Deutch & Lester 2004; Lash & Wynne 1992, p. 4). Heimer (1985) identified the illusory nature of this supposed realism in her prescient analysis of the reactive nature of risk, demonstrating how risk (probabilities of threats to safety and security) would necessarily elude our grasp because each effort to control risk transformed its probabilities in an ever-escalating spiral. Embracing risk also refers to the specific policies and techniques instituted over the past several decades to undo the system of collective security. 
“Across a wide range of institutions, officials are now as concerned about the perverse effects of . . . risk shifting [i.e., risk sharing], as they are about the risks [probabilities of hazard] being shifted” (Baker & Simon 2002, p. 4). In place of the regime of risk containment, proponents of flexibility argue that safety and security can be achieved more effectively by embracing and privatizing risk. Although pro-privatization market policies that attempt to “make people more individually accountable for risk” (Baker & Simon 2002, p. 1) are often justified as natural and efficient, there is nothing natural about them (Klein 2007, MacKenzie 2006). Just as risk-spreading was achieved through the efforts of financial and moral entrepreneurs to transform common, often religious, conceptions of morality, responsibility, and money (Becker 1963; Zelizer 1979, 1997), contemporary risk-embracing policies are also the outcome of ideological struggles. If in the nineteenth century marketing life insurance required a modification in what it meant to protect one’s family by providing materially for them after death rather than seeming to earn a profit from death, so too risk-embracing policies in the twentieth and twenty-first centuries require a similar redefinition in what it means to be responsible, productive citizens. Contemporary moral entrepreneurs energetically promote risk taking rather than risk sharing as morally desirable; the individual more effectively provides for family security, it is claimed, by participating in a competitive, expanding, market economy than by relying on government-constructed safety nets.
This moral entrepreneurship directs our attention to safety culture because the concept arises as a means of managing technological risk, just as the previous security regime has been successfully dismantled. This is not to say that the nineteenth to twentieth century regulatory system was perfect, nor as good as it might have been, nor that it prevented or repaired all or most technological damage. It was, however, a means of distributing, if not preventing, the costs of injuries. Yet, for most of the twentieth century, risk analysts themselves expended a good part of their energy attacking this system, legitimating the risks undertaken, reassuring the public that they were nonetheless being protected, and second-guessing the regulatory “agencies’ attempts to do a very difficult job” (Perrow 1999 [1984], p. 307). Paradoxically, many risk analysts regularly assessed the risks of regulation more negatively than the risks of the hazards themselves (e.g., Deutch & Lester 2004). With a commitment to the idea of efficient markets, critics of regulation produced accounts of government regulation as publicly sanctioned coercion sought by private firms to consolidate market power, inhibit price competition, and limit entry. As a result, critics argued, the system produced inefficiencies, a lack of price competition, higher costs, and overcapitalization (Joskow & Noll 1977, Joskow & Rose 1989; cf. Schneiberg & Bartley 2008). Interestingly, these challenges to government regulation rarely placed as high a value on the consumer service, product quality, and environmental protection that were also promoted by regulation. The accounts of corporate capture undermining regulatory effectiveness (Bernstein 1955; Derthick & Quirk 1985; Peltzman et al. 1989; Vogel 1981, 1986) also ignored the new social regulation in safety, consumer protection, and civil rights.
Perhaps the focus on market control, and a latent hostility to the struggles between labor and capital and between manufacturers and consumers that became ideologically entwined with the struggles against regulation, blinded scholars to non-economic variables such as safety that had also been part of the regulatory regime. For whatever reasons, ideological or coincidental, the focus on market competition as the central guarantor of productivity and efficiency overlooked constituent structural features of the regime of government regulation, insurance, and liability that mitigated risk by promoting countervailing interests in safety and responsibility. Notably, the nineteenth to twentieth century solidarity regime was “not only a paradigm of compensation but also one of prevention” (Ewald 2002, p. 281). Bottom-line profit taking required diligent efforts not simply to estimate costs and prices but also to prevent losses, that is, accidents and disasters. A host of institutional practices and organizations promoted responsibility by enacting prevention, in this way reducing costs and increasing profit. For example, the great life insurance companies were pioneers in epidemiology and public health. The fire insurance industry formed Underwriters Laboratories, which tests and certifies the safety of household appliances and other electrical equipment. Insurance companies seeking to cut their fire losses formed the first fire departments. More recently, health insurance companies have been behind many efforts to compare, test, and measure the effectiveness of medical procedures (Baker & Simon 2002, p. 8; cf. Knowles 2007a,b). Under the solidarity regime, industries, individual firms, and labor unions collectively promoted forms of social control, workplace discipline, and self-governance that were expected to reduce injuries and thus costs for the various organizations (Ericson et al. 2003). Minimally, they identified the worst offenders.
Insurance companies have traditionally also taken precautions to mitigate financial losses not only through safer practices but through investment of premiums and reinsurance. The post-1929 American banking and financial industry regulations purposively segregated different financial functions and markets to prevent excessive losses in one activity from contaminating related industries and parallel silos in the financial markets. However, since the systematic deconstruction of this regulatory regime began in the 1980s, insurance firms, like many corporations, have become ever more financialized, earning profit more directly from investments in global financial markets than from selling insurance. With the invention of derivatives and similar instruments, a wider array of firms have been transformed into financial rather than productive entities. Financialization means that capital and business risks are disaggregated, recombined in heterogeneous assets that are bought and sold globally, and distributed among myriad other firms, shareholders, and markets. Losses in these assets are supposedly protected through insurance swaps. There is an indirect but substantial consequence for safety in this financialized system because there is less interest in the reliability of the specific products manufactured or services offered. Less financial risk means reduced attention to the associated practices that encourage risk prevention and enhance safety.[4] Finally, we cannot ignore the role of civil litigation as part of the twentieth century solidarity regime and its twenty-first century demise.
The expansion of rights and remedies that began slowly with the New Deal but grew rapidly post–World War II came with “a great burst of legalization.” While “regulation proliferated, extending to aspects of life previously unsupervised by the state” (Galanter 2006, p. 4), civil litigation independently generated rights. Although some commentators describe this as a litigation explosion (Friedman 1985, Kagan 2003, Lieberman 1981), it is actually a shift: from contract litigation dominating in the nineteenth century to tort litigation predominating in the twentieth century (Galanter 1983).

[4] The financial downturn that escalated to a worldwide crisis in 2008 can be attributed in part to just these practices. In the financial markets, not only was the safety of the produced material goods less salient, but the safety or security of the financial assets was of less concern because of default swaps, hedging, and insurance on bets that finally unraveled. Rather than encouraging responsibility, the layered system of disaggregation and recombination buttressed by hedges and insurance undermined critical or responsible decision making.

Although strict liability is not the generously absurd protector of irresponsibility that critics claim it to be (Burke 2004, Holtom & McCann 2004), there is no doubt that the twentieth century produced, “by any measure, a great deal more law” (Galanter 2006, p. 5). The legal profession exploded from 1 lawyer for every 627 Americans in 1960 to 1 lawyer for every 264 in 2006. Spending on law increased, as did celebration of lawyers and legal work in popular media and film (J. Silbey 2001, 2004, 2005, 2007a,b). In canonical Newtonian fashion, the expansion of law caused an energetic backlash. The early and mid-twentieth century cries that the legal system “failed to provide justice to the weak” gave way to a responsive critique that the nation was afflicted by “too much law” (Galanter 2006, p. 5, citing Galanter 1994).
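The per-capita ratios above can be turned into rough headcounts. A minimal Python sketch, using approximate U.S. census population figures (about 179 million in 1960 and 299 million in 2006) that are assumptions supplied here, not figures from the article:

```python
# Rough lawyer headcounts implied by the per-capita ratios in the text.
# Population figures are outside assumptions (approximate U.S. census estimates).
pop_1960, per_lawyer_1960 = 179_000_000, 627
pop_2006, per_lawyer_2006 = 299_000_000, 264

lawyers_1960 = pop_1960 / per_lawyer_1960   # roughly 285,000 lawyers
lawyers_2006 = pop_2006 / per_lawyer_2006   # roughly 1,130,000 lawyers

print(f"~{lawyers_1960:,.0f} -> ~{lawyers_2006:,.0f}, "
      f"a {lawyers_2006 / lawyers_1960:.1f}x increase")
```

Roughly fourfold growth in absolute numbers, against a population that grew by about two-thirds, is what "exploded" summarizes.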
One alleged legal crisis followed another, from product liability to overcrowded courts to medical malpractice (Baker 2005). Calls for tort reform and informal dispute resolution as alternatives to litigation became common, the centerpiece of organized professional and political campaigns (Burke 2004, Silbey & Sarat 1988). With Ronald Reagan’s election to the U.S. presidency in 1980 and subsequent Republican presidents, nominees to the federal courts were systematically screened for their ideological conformity with a less law, less rights agenda. By September 2008, 60% of active federal judges had been appointed under this agenda, and, as a consequence, the federal courts have joined the movement to embrace risk, becoming another voice promoting individual, rather than shared, assumption of risk (Scherer 2005). Thus, from the middle nineteenth through the late twentieth centuries, industrial and insurance firms, individual families, the civil litigation system, governmental regulatory agencies, and labor unions built and sustained a safety net of collective responsibility; they reinforced each other within a tapestry of organizations and institutions whose interests competed, yet coalesced to support relatively safer practices. The demise of those structural components is precisely what underwrites the contemporary focus on safety culture as a means of managing technological hazards.
If we do not have empowered regulatory agencies, judicial support for tort litigation, organized labor, and insurance companies with a financial interest in the safety and longevity of their customers, we have lost a good part of what made the previous paradigm work to the extent it did for as long as it did. Talk of safety culture flourishes at the very moment when advocates extend the logic of individual choice, self-governance, and rational action from the market to all social domains. Just as historic liberalism was “concerned with setting limits on the exercise of political or public authority, viewing unwarranted interventions in the market as harmful,” contemporary neoliberalism[5] promotes markets “as a principle not only for limiting government but also for rationalizing authority and social relations in general” (Shamir 2006, p. 1). Through a process of so-called responsibilization, “predisposing social actors to assume responsibility for their actions” (Shamir 2008, p. 10), these policies simultaneously empower individuals to discipline themselves while distributing, as in the nineteenth-century prudential regime, to each the costs of that discipline and the consequences for the lack thereof (Rose 1989, 1999). As a concept, responsibilization names efforts to both cultivate and trust the moral agency of rational actors as the foundation of individual and collective well-being (Shamir 2008, p. 11).
Because the propagation and inculcation of safety culture is only one approach to enhancing the reliability and safety of complex technologies, it is not unreasonable to wonder whether safety culture, focused on individual participants’ self-determined contributions to the system as a whole, might not be described as an expression of responsibilization, this neoliberal technique of governance.

[5] The term neoliberalism is conventionally used to refer to the policies advocating deregulation, privatization, and reliance on markets for both distribution and coordination, but also includes a set of fiscal, tax, and trade liberalization policies that is sometimes referred to as the Washington Consensus because of support by the International Monetary Fund and the World Bank.

Without necessarily intending to promote policies of deregulation and privatization, the celebration of safety culture as a means of managing the hazardous consequences of complex systems expresses what Weber described as an elective affinity, phenomena that do not necessarily cause one another but nonetheless vary together. In the next section, I explore calls for and accounts of safety culture to extract from this diverse literature the purported meanings and relationships of safety and responsibility.

TALK ABOUT SAFETY CULTURE

Between 2000 and 2007, academic literature and popular media exploded with references to safety culture. Over 2250 articles in newspapers, magazines, scholarly journals, and law reviews in an eight-year period included references to safety culture, whereas only 570 references were found in the prior decade.
Before 1980, I could find no references in popular or academic literature.[6] Although the unprecedented appearance and the rapidly escalating use of the concept seem to support my hypothesis of ideological affinities between talk about safety culture and the dismantling of the regulatory state, we should look more closely at what people say to interpret what they mean when they speak about safety culture.

[6] I searched LexisNexis, JSTOR, and the Engineering Village databases for the years between 1945 and 2008, using the phrases safety culture, safety (and) culture, and culture of safety within two words of each other.

The earliest uses of safety culture in newspapers and popular media invoke the term primarily in discussions of nuclear power, energy generation, and weapons production to describe within organizations an “ingrained philosophy that safety comes first” (Diamond 1986). One non-nuclear reference to a British railroad accident is illustrative because, even in this less common venue, a deteriorating safety culture was offered as the explanation for what went wrong and should be improved to prevent future accidents.[7] Mechanical error compounded by lax management processes was named as the cause of the accident. Nonetheless, the judge heading the accident inquiry focused his recommendations for improving the safety culture not on the management of the system or the communications processes within the railroad hierarchy, but on the laborers, calling for “radical improvements in recruiting and training and an end to excessive overtime” (Diamond 1986).
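The citation counts above support the "more than fourfold increase" once the unequal window lengths are taken into account. A minimal Python sketch of the arithmetic (the eight-year and ten-year windows are taken from the text; the annualized comparison itself is supplied here, not drawn from the article):

```python
# Comparing the citation counts reported in the text:
# 2250+ references in 2000-2007 (8 years) vs. ~570 in the prior decade (10 years).
recent_refs, recent_years = 2250, 8
prior_refs, prior_years = 570, 10

raw_ratio = recent_refs / prior_refs                                      # ~3.9x in absolute counts
annual_ratio = (recent_refs / recent_years) / (prior_refs / prior_years)  # ~4.9x per year

print(f"raw ratio: {raw_ratio:.1f}x, annualized: {annual_ratio:.1f}x")
```

The raw ratio falls just short of fourfold, but because the recent window is two years shorter, the per-year rate of references grew nearly fivefold.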
Although talk about safety culture emerged during the 1980s, when major accidents at Three Mile Island, Bhopal, and Chernobyl weakened public confidence in complex technologies, only well into the 1990s did talk about safety culture become a common phenomenon. Although thousands of newspaper articles were written about the March 28, 1979, partial meltdown of Unit 2 at the Three Mile Island nuclear power plant in Dauphin County, Pennsylvania, none spoke about the plant’s safety culture. We first see accounts of lax safety culture following the December 3, 1984, explosion of a Union Carbide plant synthesizing and packaging the pesticide methyl isocyanate in Bhopal in the Indian state of Madhya Pradesh. In these early references, the phrase is invoked primarily to denote culture in its more colloquially circulating meaning: to suggest that nations vary in their respect for safety. Because the Indian partners in the Union Carbide plant did not share the American culture (which implicitly valued safety), they were, by inference, responsible for the accident. John Holtzman, spokesman for the Chemical Manufacturers Association in Washington, DC, pointed to “the differences in ‘safety culture’ between the US and other countries. . . . We have a certain sense of safety. You see it in campaigns like ‘buckle up.’ It’s not necessarily the same elsewhere. It’s difficult to enforce our culture on another country,” Holtzman said, “especially when the other country seems willing to take risks in exchange for speedy technological advance” (Kiefer 1984). This use of safety culture to name variations in national cultures, reminiscent of historic justifications for colonial rule, did not stick.

7 During the morning rush hours of December 12, 1988, 35 people were killed and another 100 injured when “one commuter train rammed the rear of a stopped commuter train, outside busy Clapham Junction in south London. The wreckage was then struck by a freight train” (Associated Press 1989).
Very quickly, it became apparent that the preexisting safety problems in the Bhopal plant were not peculiar to Bhopal, or to India. Although Union Carbide had insisted that the conditions in Bhopal were unique, one of its sister plants in Institute, West Virginia, produced a similar accident just eight months later (Perrow 1999 [1984], p. 358). Although an Occupational Safety and Health Administration (OSHA) inspection had previously declared the West Virginia plant in good working order, the OSHA inspection following the explosion declared that this was “an accident waiting to happen,” citing hundreds of longstanding, “constant, willful, violations” (quoted in Perrow 1999 [1984], p. 359). Clearly, the different national cultures of India and the United States could not explain these accidents, which seemed to have had some other source. No one mentioned the role of lax inspections as part of the safety culture. With the exception of one story about how E.I. DuPont de Nemours & Co is “recognized within industry for its [exemplary] safety” practices (Brooks et al. 1986), the early references in popular media to safety culture do little more than invoke the term. They provide little specification of what activities, responsibilities, or symbolic representations contribute to a safety culture. In professional and scholarly literature, the phrase safety culture first appears in a 1986 report of the International Atomic Energy Agency (IAEA) on the Chernobyl accident. Three years later, a second reference by the U.S. Nuclear Regulatory Commission (1989) states that plant management “has a duty and obligation to foster the development of a ‘safety culture’ at each facility and throughout the facility, that assures safe operations.” After five years in common usage, an IAEA report defined
safety culture as “that assembly of characteristics and attitudes in organizations and individuals which establishes that, as an overriding priority, nuclear power safety issues receive attention warranted by their significance” (IAEA 1991, p. 8; 1992). As Perin (2005) comments in her detailed study of four nuclear power plants, “Determining that significance in particular contexts is . . . the crux of the quandary” (p. 14). For the past two decades, researchers have been actively engaged in analyses of safety culture, with the vast majority of work produced in engineering, management, and psychology, and a smattering of mostly critical work produced in sociology and political science. If we look across these fields, we find variation in the ways in which safety culture is invoked, although there is a great deal of conceptual importation from the social sciences to what we may think of as applied social science in engineering and management.

The General Concept of Culture

Culture is an actively contested concept; its importation into organizational and engineering analyses is equally contentious. Confusion derives in part from intermingling two meanings of culture: a concrete world of beliefs and practices associated with a particular group, and an analytic tool of social analysis referring to a system of symbols and meanings and their associated social practices, both the product and context of social action.
The analytic concept is invoked (a) to recognize signs, performances, actions, transactions, and meanings as inseparable, yet (b) “to disentangle, for the purpose of analysis [only], the semiotic influences on action from the other sorts of influences—demographic, geographical, biological, technological, economic, and so on—that they are necessarily mixed with in any concrete sequence of behavior” (Sewell 2005, p. 160). Thus, organizational culture and safety culture are terms used to emphasize that organizational and system performances are not confined to formally specified components, nor to language alone. Although formal organizational attributes and human interactions share symbolic and cognitive resources, many cultural resources are discrete, local, and intended for specific purposes. Nonetheless, it is possible (c) to observe general patterns so that we are able to speak of a culture, or cultural system, at specified scales and levels of social organization. “System and practice are complementary concepts: each presupposes the other” (Sewell 2005, p. 164),8 although the constituent practices are neither uniform, logical, static, nor autonomous. As a collection of semiotic resources deployed in interactions (Swidler 1986), “culture is not a power, something to which social events, behaviors, institutions, or processes can be causally attributed; it is a context, something within which [events, behaviors, institutions, and processes] can be intelligibly—that is, thickly—described” (Geertz 1973, p. 14). (d) Variation and conflict concerning the meaning and use of these symbols and resources are likely and expected because at its core, culture “is an intricate system of claims about how to understand the world and act in it” (Perin 2005, p. xii; cf. Helmreich 2001).
Culture as Causal Attitude

For some authors, safety culture is understood as a measurable, instrumental source composed of individual attitudes and organizational behavior, or conversely as a measurable product of values, attitudes, competencies, and behaviors that are themselves the cause of other actions (Cox & Cox 1991, Geller 1994, Glennon 1982, Lee 1996, Ostrom et al. 1993). In both uses, culture “determine[s] the commitment to, and the style and proficiency of, an organization’s health and safety programs” (Reason 1997, p. 194, citing Booth, UK Health and Safety Commission 1993). Whether the first mover or an intermediate mechanism, “an ideal safety culture is the engine that continues to propel the system toward the goal of maximum safety health, regardless of the leadership’s personality or current commercial concerns” (Reason 1997, p. 195).

8 “The employment of a symbol,” Sewell (2005, p. 164) writes, “can be expected to accomplish a particular goal only because symbols have more or less determinate meanings—meanings specified by their systematically structured relations to other symbols. But it is equally true that the system has no existence apart from the succession of practices that instantiate, reproduce, or—most interestingly—transform it. Hence, a system implies practice. System and practice constitute an indissoluble duality or dialectic.”
Culture as the ultimate, intermediate, or proximate cause often leaves unspecified the particular mechanism that shapes the safe or unsafe outcomes of the organization or technology (but see Glennon 1982, Zohar 1980), with much of the management and engineering literatures debating exactly this: how to operationalize and measure both the mechanism and the outcome. Clearly, this conception of safety culture belies exactly that thick description of practice and system that cultural analysis entails (Fischer 2006, Geertz 1973, Silbey 2005b). A persistent muddle in this usage derives, in part, from the aggregation over time and across professional communities of concepts developed to name the emergent properties of social interactions not captured by the specification of components, stakeholders, objectives, functions, and resources of formal organizations. There seems to be a recurring cycle in which heretofore unnamed or unperceived phenomena are recognized as playing a role in organized action. A construct is created to name what appear to be stable, multidimensional, shared features of organized practices that had not yet been captured by existing categories and measures. All this is fine and congruent with the best sociology. However, once the phenomena are named, some researchers attempt to specify and measure them more concretely; disparate results generate continuing debate about different conceptualizations and measurement tools (Cooper 2000, Guldenmund 2000). As empirical results outpace the purportedly descriptive models, new constructs are offered to name the persistent, yet elusive effluent of unpredicted events, now hypothesized as intangible cultural causes, fueling additional debate. 
Thus, talk of safety culture emerged as a subset from prior talk about organizational culture (Beamish 2002; Bourrier 1996; Carroll 1998a,b; Cooper 2000; Schein 1992), and both organizational and safety culture developed alongside concepts of organizational climate and safety climate, generating a bewildering mix of concepts and measures. Numerous efforts have attempted to parse these terms, with negligible theoretical advance (Denison 1996, Zhang et al. 2002). To some extent, the conceptual puzzle is energized by occupational and professional competitions, different disciplinary communities pushing in one direction or another, using preferred concepts and tools to authorize expert advice about how to design systems, assess performance, and manage them on the basis of this information (Abbott 1988). Although organizational and safety culture can and should be normatively neutral, the terms have usually been deployed to emphasize a positive aspect of organizations, one that leads to increased safety by fostering, with minimal surveillance, an efficient and reliable workforce sensitized to safety issues. The framing generates ellipses that invite further conceptual elaboration to account for what has been excluded in the particular normative tilt of the concept. Although some authors view culture as something that can be changed—managed to improve organizational performance—and seek to develop models to generate more effective safety culture (Carroll 1998b, Cooper 2000), others adopt more disinterested formulations (Beamish 2002, O’Reilly & Chatman 1996). Guldenmund’s systematic review of the literature through 2000 describes organizational and safety culture as general frames determining organizational and safety climates. The term organizational climate was coined to refer to a global, integrating concept underlying most organizational events and processes. 
Nowadays, this concept is referred to by the term organizational culture whereas organizational climate has come to mean more and more the overt manifestation of culture within an organization. Therefore, climate follows naturally from culture, or, put another way, organizational culture expresses itself through organizational climate (Guldenmund 2000,